Risk-Based Generalizations of f-divergences

Authors

  • Dario García-García
  • Ulrike von Luxburg
  • Raúl Santos-Rodríguez
Abstract

We derive a generalized notion of f-divergences, called (f, l)-divergences. We show that this generalization enjoys many of the nice properties of f-divergences, although it is a richer family. It also provides alternative definitions of standard divergences in terms of surrogate risks. As a first practical application of this theory, we derive a new estimator for the Kullback-Leibler divergence that we use for clustering sets of vectors.
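
For reference, the classical Csiszár f-divergence being generalized is defined, for a convex f with f(1) = 0, as

    D_f(P \,\|\, Q) = \int q(x)\, f\!\left(\frac{p(x)}{q(x)}\right)\, dx,

with f(t) = t \log t recovering the Kullback-Leibler divergence. The surrogate-risk formulation mentioned in the abstract presumably replaces the 0-1 loss underlying the Bayes-risk representation of D_f with a surrogate loss l; the precise (f, l) construction is given in the full text. The paper's KL estimator itself is not reproduced here; purely as an illustration of how pairwise KL estimates between sets of vectors can drive a clustering step, a generic k-nearest-neighbour estimator (in the style of Wang, Kulkarni and Verdú) might look like the following sketch, where the function name and parameters are illustrative assumptions rather than the authors' code:

    import numpy as np
    from scipy.spatial import cKDTree

    def knn_kl_divergence(x, y, k=1):
        """k-NN estimate of D_KL(P || Q) from samples x ~ P and y ~ Q.
        Generic Wang-Kulkarni-Verdu style estimator, not the paper's."""
        x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
        n, d = x.shape
        m = y.shape[0]
        # distance from each x_i to its k-th neighbour within x (k+1 skips the point itself)
        rho = cKDTree(x).query(x, k=k + 1)[0][:, -1]
        # distance from each x_i to its k-th neighbour in y
        nu = cKDTree(y).query(x, k=k)[0]
        nu = nu[:, -1] if nu.ndim > 1 else nu
        return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1))

A symmetrised matrix of such pairwise estimates between the sets of vectors can then be handed to any off-the-shelf clustering routine, e.g. spectral or hierarchical clustering.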

Similar articles

Investigation to Reliability of Optical Communication Links using Auto-Track Subsystems in Presence of Different Beam Divergences

In this paper, we investigate the effects of an auto-tracking subsystem together with different beam divergences on the SNR, BER and stability of FSO communication links. For this purpose we compute the power, SNR and BER at the receiver, based on the analytic formula of a Gaussian beam on the receiver plane. In this computation the atmospheric effects including absorption, scattering and turbulence are c...
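
As a rough sketch of the kind of link-budget computation described above (the beam and link parameters below are illustrative assumptions, not values from the paper), the power collected by a circular receiver aperture from a diverging Gaussian beam can be estimated as:

    import math

    def received_power(p_tx_w, wavelength_m, w0_m, link_length_m,
                       aperture_radius_m, atm_attenuation_per_m=0.0):
        """Gaussian-beam link budget: transmit power reduced by beam divergence
        and Beer-Lambert atmospheric attenuation (illustrative only)."""
        z_r = math.pi * w0_m ** 2 / wavelength_m                  # Rayleigh range
        w_z = w0_m * math.sqrt(1.0 + (link_length_m / z_r) ** 2)  # beam radius at the receiver
        # fraction of a Gaussian beam captured by an on-axis circular aperture
        captured = 1.0 - math.exp(-2.0 * aperture_radius_m ** 2 / w_z ** 2)
        attenuation = math.exp(-atm_attenuation_per_m * link_length_m)
        return p_tx_w * captured * attenuation

    # example: 1 mW, 1550 nm beam with a 1 cm waist over 1 km onto a 5 cm aperture
    print(received_power(1e-3, 1550e-9, 1e-2, 1e3, 5e-2, atm_attenuation_per_m=1e-4))

The SNR and BER then follow from the chosen detector noise model and modulation scheme, which the truncated abstract does not specify.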

Normalized information-based divergences

This paper is devoted to the mathematical study of some divergences based on the mutual information well-suited to categorical random vectors. These divergences are generalizations of the "entropy distance" and "information distance". Their main characteristic is that they combine a complexity term and the mutual information. We then introduce the notion of (normalized) information-based di...
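
For reference, the "entropy distance" referred to here is commonly written as

    d(X, Y) = H(X \mid Y) + H(Y \mid X) = H(X, Y) - I(X; Y),

and one standard normalized variant is D(X, Y) = 1 - I(X; Y) / H(X, Y), which takes values in [0, 1]. The particular complexity terms combined with the mutual information in this paper are specified in the full text.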

Metrics Defined by Bregman Divergences

Bregman divergences are generalizations of the well-known Kullback-Leibler divergence. They are based on convex functions and have recently received great attention. We present a class of “squared root metrics” based on Bregman divergences. They can be regarded as a natural generalization of the Euclidean distance. We provide necessary and sufficient conditions for a convex function so that the squar...
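
For a differentiable, strictly convex φ, the Bregman divergence in question is

    B_\varphi(x, y) = \varphi(x) - \varphi(y) - \langle \nabla\varphi(y),\, x - y \rangle,

so that φ(x) = ‖x‖² yields the squared Euclidean distance and the negative entropy φ(x) = Σ_i x_i log x_i yields the (generalized) Kullback-Leibler divergence; the truncated abstract concerns conditions under which the square root of B_φ satisfies the triangle inequality and is therefore a metric.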

On Shore and Johnson properties for a Special Case of Csiszár f-divergences

The importance of power-law distributions is attributed to the fact that most naturally occurring phenomena exhibit this distribution. While exponential distributions can be derived by minimizing the KL-divergence w.r.t. some moment constraints, some power-law distributions can be derived by minimizing certain generalizations of the KL-divergence (more specifically some special cases of Csiszár f-...
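
As a reminder of the minimization alluded to above: minimizing D_KL(p ‖ q) subject to moment constraints E_p[T_k(x)] = t_k yields an exponential-family solution

    p^{*}(x) \propto q(x)\, \exp\Big( \sum_k \lambda_k T_k(x) \Big),

whereas minimizing certain generalized divergences, such as the Tsallis relative entropy (a Csiszár f-divergence with, in one common convention, f(t) = (t^q - t)/(q - 1)), yields q-exponential, i.e. power-law, solutions under the same constraints.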

Generalization of the de Bruijn's identity to general φ-entropies and φ-Fisher informations

In this paper, we propose generalizations of the de Bruijn's identities based on extensions of the Shannon entropy, the Fisher information and their associated divergences or relative measures. The foundations of these generalizations are the φ-entropies and divergences of the Csiszár class (or Salicrú class), considered within a multidimensional context, including the one-dimensional case, and fo...
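
The classical identity being generalized states that if Y_t = X + √t Z, with Z standard Gaussian and independent of X, then

    \frac{d}{dt}\, h(Y_t) = \frac{1}{2}\, J(Y_t), \qquad J(Y) = \mathbb{E}\!\left[ \left( \frac{\partial}{\partial y} \log p_Y(Y) \right)^{2} \right],

linking the differential Shannon entropy h to the Fisher information J; the paper extends this relation to φ-entropies and φ-Fisher informations.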


Journal:

Volume  Issue

Pages  -

Publication date: 2011